Ankur Moitra
Institute for Advanced Study
Abstract:
In the Nonnegative Matrix Factorization problem we are given an $n \times m$ nonnegative matrix $M$ and an integer $r > 0$. Our goal is to express $M$ (either exactly or approximately) as $A W$ where $A$ and $W$ are nonnegative matrices of size $n \times r$ and $r \times m$ respectively. This problem has a rich history spanning many fields and in the past decade has become enormously popular in machine learning. One of the most compelling applications is in topic modeling, where such a factorization is used to decompose a large collection of articles into constituent topics.
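To make the setup concrete, here is a minimal sketch with purely hypothetical numbers: a tiny "document-word" matrix $M$ that admits an exact nonnegative factorization with $r = 2$ topics.

```python
import numpy as np

# Hypothetical toy instance: n = 3 documents, m = 4 words, r = 2 topics.
A = np.array([[1.0, 0.0],             # document-topic weights, shape (n, r)
              [0.0, 2.0],
              [1.0, 1.0]])
W = np.array([[3.0, 1.0, 0.0, 0.0],   # topic-word weights, shape (r, m)
              [0.0, 0.0, 2.0, 4.0]])

M = A @ W   # M is entrywise nonnegative and has an exact rank-2 NMF
print(M)
```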
The prevailing approach is to compute $A$ and $W$ using a variety of local search heuristics, which are known to fail on worst-case inputs.
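The abstract does not single out a specific heuristic; one widely used representative is the multiplicative update rule of Lee and Seung, sketched below. It decreases the Frobenius error $\|M - AW\|_F$ monotonically but, like any local search, can stall at a poor local minimum.

```python
import numpy as np

def nmf_multiplicative(M, r, iters=200, seed=0, eps=1e-9):
    """Lee-Seung multiplicative updates for min ||M - A W||_F^2 over
    nonnegative A (n x r) and W (r x m). A representative local search
    heuristic; not guaranteed to find a global optimum."""
    rng = np.random.default_rng(seed)
    n, m = M.shape
    A = rng.random((n, r))
    W = rng.random((r, m))
    for _ in range(iters):
        # Each update preserves nonnegativity; eps avoids division by zero.
        W *= (A.T @ M) / (A.T @ A @ W + eps)
        A *= (M @ W.T) / (A @ W @ W.T + eps)
    return A, W
```

Different random initializations (the `seed` parameter) typically converge to different local minima, which is precisely the failure mode the talk addresses.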
The focus of this talk is on two basic questions: Is there a polynomial time algorithm that succeeds for all inputs? And can we give a theoretical explanation for why heuristics are so effective?
Our first result is an exact algorithm for Nonnegative Matrix Factorization that runs in polynomial time for every constant $r$. We prove a complementary hardness result: under the Exponential Time Hypothesis, no exact algorithm can run in time $(nm)^{o(r)}$. Lastly, we identify a condition on the input (called separability) under which we give an algorithm that runs in time polynomial in all of the parameters $n$, $m$, and $r$. Subsequently, we implemented our algorithm and applied it to a collection of New York Times articles, and found that not only does it recover very high-quality topics, but it also runs fifty times faster than existing approaches.
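The abstract leaves the separability-based algorithm unspecified; the following is a minimal sketch of the underlying geometric idea, not the authors' actual procedure. Assuming $M = AW$ is exactly separable (each topic has an "anchor" row of $A$ with a single nonzero entry), every row of $W$ appears, up to scaling, as a row of $M$; after row normalization the anchor rows are exactly the vertices of the convex hull of the rows. The helper name below is hypothetical, and the sketch further assumes $M$ has no zero or duplicate rows.

```python
import numpy as np
from scipy.optimize import linprog, nnls

def separable_nmf(M):
    """Sketch: recover W as the anchor (vertex) rows of M, then
    recover A by nonnegative least squares. Assumes exact separability."""
    X = M / M.sum(axis=1, keepdims=True)   # project rows onto the simplex
    n = X.shape[0]
    anchors = []
    for i in range(n):
        others = np.delete(X, i, axis=0)
        # Feasibility LP: is row i a convex combination of the other rows?
        #   others.T @ c = X[i],  sum(c) = 1,  c >= 0
        # (linprog's default bounds already enforce c >= 0)
        A_eq = np.vstack([others.T, np.ones(n - 1)])
        b_eq = np.append(X[i], 1.0)
        res = linprog(np.zeros(n - 1), A_eq=A_eq, b_eq=b_eq, method="highs")
        if not res.success:        # infeasible => row i is a vertex (anchor)
            anchors.append(i)
    W = M[anchors]
    # Solve each row of A via NNLS: minimize ||W.T @ a - M[i]|| with a >= 0.
    A = np.vstack([nnls(W.T, M[i])[0] for i in range(n)])
    return A, W, anchors
```

Each anchor test is a single linear program, so the whole sketch runs in time polynomial in $n$, $m$, and $r$, matching the flavor of the result stated above.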
This talk is based on joint work with Sanjeev Arora, Rong Ge and Ravi Kannan.